INFO 5325: Ethical Thinking about Digital Technologies and Data
Cornell Tech - INFO 5325
Credits: 2 hours
Fall 2018
Helen Nissenbaum, Information Science, Cornell Tech

About the course:

Digital technologies, including computing, digital media, and data science, are integrated into all aspects of contemporary life, from commerce, finance, education, politics, and entertainment to communication, transportation, and social life. They perform key functions and have made great positive contributions to quality of life. This course focuses on the ethical and political issues these technologies have raised; it studies them through the lens of social, political, and ethical values, investigating whether and how technical systems promote or impede values to which we, individually and as societies, are committed -- values such as liberty, privacy, autonomy, and justice. As the course introduces students to key technologies, ethical concepts, and diverse literatures, students will work individually and in groups, applying what they learn to real and hypothetical cases.

Taken together, the reading, writing, and discussing we do inside and outside the classroom will sharpen students' abilities to reason about ethical issues and social policy, and to grasp some of the deep and subtle connections between the design and performance characteristics of devices and systems, on the one hand, and ethical and political values, on the other. The course places special emphasis on ethical issues arising in the wake of powerful technologies for extracting, amassing, and analyzing data ("big data"), including data mining, machine learning, algorithmic decision-making and control, and AI.

The nature of the course subject matter requires the integration of multidisciplinary material and perspectives, reflected in wide-ranging readings, homework assignments, and discussions. Nevertheless, consideration will be given to students' disciplinary and methodological strengths and skills. The course welcomes students with varied backgrounds and skills; prior understanding of, and experience with, either computing (e.g., programming, website creation, active blogging) or social, political, and ethical analysis is recommended. It assumes that students are deeply interested in ethical and political issues relating to digital technologies and digital media as they affect individual lives and societies -- issues such as privacy, intellectual property, freedom of speech, and social justice.


Prerequisites: none.


Readings: Weekly reading assignments will be posted on Course Blackboard.


Grading Elements*:

Participation in class and on discussion board: 20%
Weekly reading comments: 20%
In-class presentation TBD: 10%
Homework assignments: 50%

*A passing grade is needed for each element of the course. Missed deadlines will result in grade deductions at the instructor's discretion.
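
For those who want to see how the weights combine, here is a minimal sketch of a weighted final grade, assuming each element is scored from 0 to 100. The weights and the rule that every element must be passed come from the syllabus; the numeric scores and the PASSING_SCORE threshold below are hypothetical illustrations, not official grading policy.

```python
# Illustrative only: how the published weights might combine into a final grade.
# Weights are from the syllabus; scores and the passing threshold are hypothetical.

WEIGHTS = {
    "participation": 0.20,      # class and discussion board
    "reading_comments": 0.20,   # weekly reading comments
    "presentation": 0.10,       # in-class presentation
    "homework": 0.50,           # homework assignments
}

PASSING_SCORE = 60  # hypothetical; the syllabus only says each element must be passed


def final_grade(scores: dict) -> float:
    """Weighted average of element scores (0-100), requiring a pass in every element."""
    if any(scores[element] < PASSING_SCORE for element in WEIGHTS):
        raise ValueError("A passing grade is needed for each element of the course.")
    return sum(WEIGHTS[element] * scores[element] for element in WEIGHTS)


# Example with hypothetical scores:
print(final_grade({
    "participation": 90,
    "reading_comments": 85,
    "presentation": 80,
    "homework": 88,
}))  # 0.2*90 + 0.2*85 + 0.1*80 + 0.5*88 = 87.0
```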

Students who successfully complete this course will be able to:

- Recognize and describe ethical issues relating to digital technologies.
- Recognize how and to what extent values are implicated in technical artifacts, incidentally or by design.
- Engage critically with everyday technical systems.
- Recognize instances of design that seem to elevate or obstruct certain values.
- Engage actively with values embodied in particular systems or devices so as to recognize alternative designs with differing values implications.
- Engage with fundamental concepts in the philosophy and social study of technology.
- Engage with fundamental concepts in ethical and political theory.
- Critically analyze key social and political issues surrounding contemporary digital information systems and networks, e.g. privacy, intellectual property, freedom of speech.
- Demonstrate conceptually or by prototype the values implications of particular design choices in particular systems.
- Think rigorously, systematically, and critically about values in technical systems and devices.


Approach to Teaching and Learning

Classes will comprise a variety of activities, including instructor presentation, group discussion of readings, and individual and small-group presentation of mini-projects. Instructor-led and small-group discussions will focus on concepts and arguments drawn from weekly reading assignments. You are encouraged to study together and to discuss information and concepts covered in lectures and sections with other students.
Cautions
- Permissible cooperation does not include: one student copying all or part of work done by someone else, or one student signing the attendance sheet for another student.
- Should copying occur, both the student who copied work from another student and the student who gave material to be copied will automatically receive a zero for the assignment. The penalty for violating this Code may also extend to failure of the course and University disciplinary action. In general, students in this course are expected to abide by the Cornell University Code of Academic Integrity.


Readings and Posting Comments

All required readings are posted on the Class Blackboard. Students should post comments on readings each Tuesday evening before class on Wednesday. Recommended readings (for students wishing to delve deeper into particular issues) are clearly marked.



Schedule

Aug 29 Introduction to the Course

Readings: no assigned readings for first session.

Sep 5

Ethical Thinking: Values in Design
Langdon Winner's claim that technologies have politics has inspired generations of scholars and designers -- those who believe it to be true as well as those who seek to disprove it. Winner's idea serves as a jumping-off point for ethical thinking about values in design (VID). Because of its centrality, we study the article carefully and critically. As you read, ask what views the Winner article contradicts. We consider VID as a tool for analysis and as a guide to practice.

Readings:
1. Winner, L. "Do Artifacts Have Politics?" The Whale and the Reactor. Chicago: The University of Chicago Press, 1986, 19-39
2. Postman, N. "Five Things We Need to Know About Technological Change."


Sep 12

Bias, Fairness, Discrimination - Part 1
Bias and unfair discrimination are among the most serious worries over increasing reliance on computerized control and decision-making. In the first class session devoted to this topic (there will be others), we will examine earlier instances of this concern, before its application to big data and AI, where contemporary concern is most acute. Through these early cases, we will think through the nature of bias, why it is troubling from an ethical perspective, and what is distinctive about bias when it is embodied in technical systems.

Readings:
1. Friedman, B. & Nissenbaum, H. "Bias in Computer Systems." ACM Transactions on Information Systems 14:3 (1996): 330-347
2. Weber, R. "Manufacturing Gender in Military Cockpit Design." The Social Shaping of Technology. Eds. MacKenzie, D. and J. Wajcman. Milton Keynes: Open University Press, 1985.
3. Introna, L. and H. Nissenbaum, "Defining the Web: The Politics of Search Engines," IEEE Computer, 33:1, 54-62


Sep 19

Philosophical Foundations
This week we take a look at some theoretical tools from philosophy that can supply a foundation for ethical reasoning. We'll discuss how we can tell when we're facing an ethical problem and on what grounds we might defend an ethical position. As an example case, we'll consider the decision that Christopher Wylie made to become a whistleblower against Cambridge Analytica.

Readings:
1. Fieser, James. (n.d.) "Ethics." Internet Encyclopedia of Philosophy (Only these excerpts: First two paragraphs; Section 2 Normative Ethics; Section 3a. Normative Principles in Applied Ethics).
2. Cadwalladr, C. (18 March, 2018). "'I made Steve Bannon's psychological warfare tool': meet the data war whistleblower." The Guardian.
3. Also watch the video interview with Christopher Wylie


Sep 26

Privacy - Part I
It is no surprise that privacy has remained one of the central themes associated with digital technologies. These technologies have excelled at monitoring people, recording, storing, and organizing data, and enabling inference and analysis through computational capacities. From data stored in mainframes, to networked servers, online platforms, mobile devices, and networked objects (IoT), the vectors for so doing have grown in breadth and power, expanded into new domains, and enabled practices of great value to private industry and government. At the same time, they have multiplied threats to privacy. In this class period, small groups will review and discuss a handful of cases that have attracted public scrutiny. To prepare, read the article under Readings and the Cases, focusing on the starred ones.

Readings:
1. Hoofnagle, C.J., Soltani, A., Good, N., Wambach, D.J., and Ayenson, M.D. "Behavioral Advertising: The Offer You Cannot Refuse." Harvard Law & Policy Review 6, 273 (2012).

Cases:
*"An Intentional Mistake: the Anatomy of Google's Wi-Fi Sniffing Debacle"
*"How Google Collected Data from Wi-Fi Networks"
*"Yahoo Scans Emails for Data to Sell"
*"Thirty-One Privacy and Civil Liberties Organizations Urge Google to Suspend Gmail"
*"The Fuss About Gmail and Privacy"
*"How Game Apps that Captivate Kids Have been Collecting Their Data"
*"What Walmart Knows About Customers' Habits"
*"Facebook Delivers a Confident Sales Pitch to Advertisers"
*"You for Sale"
*"Banks and Retailers are Tracking How you Click, Swipe, and Tap"
*"Thermal imaging gets more common but the courts haven't caught up"
*"Accuweather caught sending user location data, even when sharing is off"
*"Why is this company tracking where you are on Thanksgiving?"
*"Google tracks your movements, like it or not"


Oct 3

The Practical Turn: Values at Play I
For some, the idea is obvious: if technical systems and devices embody values, the designers and creators of these technologies should be able to take a proactive stance and think about values in the process of developing technologies. At this pivotal point in the course, we consider how to take values into consideration and at the same time collectively think in more concrete terms about your projects. There are also theoretically inspired challenges to the practical turn, which we will address in later weeks. Privacy will serve as an application focus.

Readings:
1. Flanagan, M. and H. Nissenbaum, Values at Play in Digital Games, Cambridge: MIT Press, 2014. (See Excerpts 1 and 2)
2. Berlin, I. "The Crooked Timber of Humanity." (1991) The Crooked Timber of Humanity: Chapters in the History of Ideas. Ed. H. Hardy. New York: Knopf, 1-19.
3. Perry, J., Macken, E., Scott, N. and J. McKinley. "Disability, Inability, and Cyberspace." Human Values and the Design of Computer Technology. Ed. Batya Friedman. New York: Cambridge University Press, 1997. 65-90.
4. Bentham, J. "Panopticon; or the Inspection House." (1791) The Panopticon Writings. New York: Verso, 1995. 31-37.


Oct 10

Bias, Fairness, Discrimination - Part II
Bias in search engines was the tip of the iceberg. As big data yielded data science and machine learning yielded algorithmic decisions and AI, experts have warned that unfair discrimination is not merely an accidental outcome of poorly thought-out automated decision systems, particularly those applied to people in societal and institutional settings, but possibly an inevitable outcome of such systems. At every stage of development, from data collection, selection, measurement, analysis, and learning to situated implementation, unfairness lurks, not least because we live in societies where historical and institutional unfairness is rife. Because this issue has generated an enormous body of research in a very short time, we will manage to dip into only a few illustrative cases.

Guest lecturer: Professor Solon Barocas

Readings:
1. Barocas, S., M. Hardt, A. Narayanan (Draft) Fairness and Machine Learning: Limitations and Opportunities, Chapter One (Introduction)

Cases: (Recommended, for now)
Valentino-Devries, J., Singer-Vine, J., and Soltani, A. "Websites Vary Prices, Deals Based on Users' Information." Wall Street Journal, 24 Dec 2012.
"Racism is Poisoning Online Ad Delivery, says Harvard Professor." MIT Technology Review, February 4, 2013.
Miller, C. "Can an Algorithm Hire Better than a Human?" The New York Times. June 28, 2015.
Mann, G. and O'Neil, C. "Hiring Algorithms are Not Neutral." Harvard Business Review. December 9, 2016.


Oct 17

Accountability, Transparency, Explanation
At a minimum, ethics requires that we take responsibility for our actions. In just and decent societies, people who are subjected to the decisions and actions of others are owed explanations, and those who hold power over the lives and interests of others should be held to account and not be allowed to exercise these powers arbitrarily. As institutional action is increasingly automated via computer and algorithmic systems, these foundational assumptions are increasingly challenged -- as functionality is spread across multiple unfamiliar actors, as algorithmic systems become difficult, if not impossible, to comprehend, and as economic and political incentives to automate seem to outweigh ages-old commitments to accountability, transparency, and protections against tyranny.

Readings:
1. Nissenbaum, H. "Accountability in a Computerized Society," in Science and Engineering Ethics, 1996, 2, 25-42
2. Citron, D. and F. Pasquale, "The Scored Society: Due Process for Automated Predictions?" Univ. of Maryland Francis King Carey School of Law, Legal Studies Research Paper, No. 2014-8
3. Bornstein, A. "Is Artificial Intelligence Permanently Inscrutable? Despite new biology-like tools, some insist interpretation is impossible." (2016). Nautilus No. 40.
4. Wakabayashi, D. "Self-Driving Uber Car Kills Pedestrian in Arizona, Where Pedestrians Roam." The New York Times. March 19, 2018.


Oct 24

Privacy - Part II
This session focuses on a family of privacy challenges arising from data science and big data. In addition, it includes an overview of the landscape of privacy theory and regulation, with a focus on the theory of privacy as contextual integrity.

Readings:
1. Duhigg, C. "How Companies Learn Your Secrets." The New York Times. Feb 16, 2012.
2. Pangborn, P.J. "Even This Data Guru is Creeped Out by What Anonymous Location Data Reveals About Us," Fast Company, Sept. 9, 2017
3. Al Jazeera, "Terms of Service: Understanding our role in the world of Big Data"
4. Nissenbaum, H. (2015) "'Respect for Context': Fulfilling the Promise of the White House Report," In Privacy in the Modern Age: The Search for Solutions, Eds. M. Rotenberg, J. Horwitz, J. Scott, EPIC, New York: EPIC/The New Press, 152-164.
5. Singer, N. "When a Health Plan Knows How You Shop," The New York Times, July 6, 2014.


Oct 31

The Practical Turn: Values at Play II
We continue thinking about the role of "conscientious" designers and developers in creating technical systems that embody and promote ethical and political values. We will consider some of the theoretical assumptions behind this approach and will work through applications of it to particular cases.

Readings:
1. Flanagan, M. and H. Nissenbaum, Values at Play in Digital Games, Cambridge: MIT Press, 2014. (See Excerpts 1 and 2)
2. Weinberg, A. M. "Can Technology Replace Social Engineering?" Controlling Technology: Contemporary Issues. Ed. W. B. Thompson. Buffalo, NY: Prometheus Books, 1991. 41-48.
3. Berlin, I. "The Crooked Timber of Humanity." (1991) The Crooked Timber of Humanity: Chapters in the History of Ideas. Ed. H. Hardy. New York: Knopf, 1-19.
4. Perry, J., Macken, E., Scott, N. and J. McKinley. "Disability, Inability, and Cyberspace." Human Values and the Design of Computer Technology. Ed. Batya Friedman. New York: Cambridge University Press, 1997. 65-90.


Nov 7

Digital Manipulation
We usually think of humans (scientists and engineers) as shapers of technology. But, in turn, technology, architecture, mechanism, and design may shape humans -- affording and constraining certain behaviors, changing our beliefs, provoking critical reflection, encouraging better habits, and making us better people. All well and good, but what about technology that may afford and constrain in questionable ways, that may offer the means of exploitation and manipulation? Humans manipulating one another is probably as old as social life itself; how is it different when enacted or mediated by software?

Readings:
1. Harris T. "How Technology Hijacks People's Minds - from a Magician and Google Design Ethicist." (2016). Medium.com.
2. Scheiber, N. "How Uber Uses Psychological Tricks to Push Its Drivers' Buttons," The New York Times, April 2, 2017.
3. Scheiber, N. "When Websites Won't Take No For an Answer," The New York Times, May 14, 2016.
4. Levin, T. "Facebook told advertisers it can identify teens feeling 'insecure' and 'worthless'." The Guardian. May 1, 2017.
5. Paskova, Y. "OKCupid Plays with Love in User Experiments." The New York Times, July 28, 2017.


Nov 14

Resisting Technology with Technology
When is it justifiable to take matters into one's own hands to express protest and resist systems with which one disagrees?

Readings:
1. Gurses, S., R. Overdorf, and E. Balsa (2018) "Stirring the POTs: Protective Optimization Technologies" [draft short version]
2. Or: Overdorf, R., B. Kulynych, E. Balsa, C. Troncoso, and S. Gurses, "POTs: Protective Optimization Technologies," arXiv:1806.02711v3 [cs.CY], August 30, 2018 [longer, more technical version]
3. Kantor, J. and S. Hodgson, "Working anything but 9 to 5," The New York Times, August 13, 2014.
4. Marx, G. "A Tack in the Shoe: Neutralizing and Resisting the New Surveillance," Journal of Social Issues, Vol. 59, No.2, 2003, 369-390.
5. https://adnauseam.io/
6. https://cs.nyu.edu/trackmenot/


Nov 21

Thanksgiving Recess


Nov 28

From Content Moderation to Censorship
Many social media companies, including Twitter, YouTube, and Facebook, provide services that require them to make decisions about what content to permit and to prioritize on their platforms. They may ban users for abusive posts, delete messages that promote terrorism, place restrictions on pornography, recommend relevant content, etc. Some of those decisions are made directly by humans; others are made by algorithms. Regardless, such decisions often involve difficult value tradeoffs, diverse stakeholders, and conflicting interests. To design policies that guide such decisions, companies must consider values like freedom of speech, freedom of association, well-being, social welfare, democracy, public safety and security, as well as other factors, like company goals, legal obligations, obligations to shareholders, reputation, and financial interests. In the face of all these considerations, what content moderation policies should social media companies adopt?

Readings:
1. Marantz, A. "Reddit and the Struggle to Detoxify the Internet," The New Yorker, March 19, 2018.
2. Taub, A. and Fisher, M., "Where Countries Are Tinderboxes and Facebook Is a Match," The New York Times, April 24, 2018.
3. Diresta, R., "Up Next: A better recommendation system," Wired, April 11, 2018.
4. (Read/browse, but don't summarize for the reading reflection): Jack, C. "Lexicon of Lies: Terms for Problematic Information," Data & Society, August 9, 2017.